A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)

Author

  • Thomas M. Cover
Abstract

Let P(i) = (1 − θ)θ^i be a probability assignment on the set of nonnegative integers, where θ is an arbitrary real number, 0 < θ < 1. We show that an optimal binary source code for this probability assignment is constructed as follows. Let l be the integer satisfying θ^l + θ^(l+1) ≤ 1 < θ^l + θ^(l−1), and represent each nonnegative integer i as i = lj + r, where j = ⌊i/l⌋, the integer part of i/l, and r = i mod l. Encode j by a unary code (i.e., j zeros followed by a single one), and encode r by a Huffman code, using codewords of length ⌊log₂ l⌋ for r < 2^(⌊log₂ l⌋+1) − l, and length ⌊log₂ l⌋ + 1 otherwise. An optimal code for the nonnegative integers is the concatenation of these two codes.

The Huffman source coding algorithm [1], [2] is a well-known algorithm for encoding the letters of a finite source alphabet into a uniquely decipherable code of minimum expected codeword length. Since the algorithm operates by successively "merging" the least probable letters in the alphabet, it cannot be directly applied to infinite source alphabets. In this correspondence, we show how the Huffman algorithm can be used indirectly to prove the optimality of a code for an infinite alphabet if one can guess what the code should be first. Naturally it is not always easy to guess the structure of an optimal code, but if the structure is simple enough, and if one starts with the simplest cases, guessing often works. The particular case that we deal with here is that of the nonnegative integers with a geometric probability assignment,

P(i) = (1 − θ)θ^i,  i ≥ 0,  (1)

for some arbitrary θ, 0 < θ < 1. This particular distribution arises in run-length coding, where if one has an independent-letter binary source, with θ being the probability of a zero, then P(i) is the probability of a run of i zeros. The distribution also arises in other ways, such as encoding protocol information in data networks.

Theorem 2: M jointly ergodic countable-alphabet stochastic processes can be sent separately at rates R_1, R_2, ..., R_M to a common receiver with arbitrarily small probability of error, if and only if

Σ_{i ∈ S} R_i ≥ H(X^(i), i ∈ S | X^(i), i ∈ S^c) = H(X^(1), X^(2), ..., X^(M)) − H(X^(i), i ∈ S^c)  (16)

for all subsets S ⊆ {1, 2, ..., M}, where S^c denotes the complement of S, and H denotes the entropy of the set of processes indexed by S conditioned on the processes indexed by S^c. We have used the obvious extension of (2) to define the entropy H. Note, in the particular case M = 3, that these equations coincide with the following equations exhibited by Wolf [6] for the independent identically distributed case:

R_1 ≥ H(X^(1) | X^(2), X^(3))
R_2 ≥ H(X^(2) | X^(1), X^(3))
R_3 ≥ H(X^(3) | X^(1), X^(2))
R_1 + R_2 ≥ H(X^(1), X^(2) | X^(3))
R_1 + R_3 ≥ H(X^(1), X^(3) | X^(2))
R_2 + R_3 ≥ H(X^(2), X^(3) | X^(1))
R_1 + R_2 + R_3 ≥ H(X^(1), X^(2), X^(3)).  (17)
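As a quick illustration of the condition in Theorem 2, the following Python sketch checks whether a given rate vector lies in the region (16). It is a minimal sketch, not code from the correspondence; it assumes a user-supplied callback cond_entropy(S, Sc) that returns the conditional entropy rate of the processes indexed by S given those indexed by S^c (the function and parameter names are illustrative).

```python
from itertools import combinations

def in_slepian_wolf_region(rates, cond_entropy):
    """Check the condition of Theorem 2: for every nonempty subset S of sources,
    sum_{i in S} R_i >= H({X^(i): i in S} | {X^(i): i in S^c}).

    rates        -- dict mapping source index i to its rate R_i
    cond_entropy -- assumed callback: cond_entropy(S, Sc) returns the conditional
                    entropy rate of the processes in S given those in S^c
    """
    indices = sorted(rates)
    for k in range(1, len(indices) + 1):
        for S in combinations(indices, k):
            Sc = tuple(i for i in indices if i not in S)
            if sum(rates[i] for i in S) < cond_entropy(S, Sc):
                return False
    return True
```

For M = 3 the two loops enumerate exactly the seven inequalities listed in (17).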
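The code construction for the geometric distribution described in the first part of the abstract (a unary code for the quotient j, a truncated Huffman code for the remainder r) can be sketched as follows. This is a small illustration written for this page under the stated construction, not an implementation taken from the correspondence.

```python
from math import floor, log2

def encode_geometric(i, theta):
    """Encode the nonnegative integer i for P(i) = (1 - theta) * theta**i,
    following the construction described in the abstract above."""
    assert i >= 0 and 0 < theta < 1

    # Smallest l with theta**l + theta**(l+1) <= 1; this l also satisfies
    # 1 < theta**l + theta**(l-1), as required by the construction.
    l = 1
    while theta**l + theta**(l + 1) > 1:
        l += 1

    j, r = divmod(i, l)          # i = l*j + r

    unary = "0" * j + "1"        # j zeros followed by a single one

    # Truncated binary (Huffman) code for r in {0, ..., l-1}:
    # floor(log2 l) bits if r < 2**(floor(log2 l)+1) - l, one more bit otherwise.
    b = floor(log2(l))
    short_codewords = 2 ** (b + 1) - l
    if r < short_codewords:
        tail = format(r, "b").zfill(b) if b > 0 else ""
    else:
        tail = format(r + short_codewords, "b").zfill(b + 1)

    return unary + tail
```

For example, with theta = 0.9 the loop gives l = 7, so the remainder is encoded with codewords of length 2 or 3 bits.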


Similar Articles

Analysis of 1-D Nested Lattice Quantization and Slepian-Wolf Coding for Wyner-Ziv Coding of i.i.d. Sources

Entropy-coded scalar quantization (ECSQ) performs within 1.53 dB of optimal lossy compression at high data rates. In this paper, we generalize this result to distributed source coding with side information. 1-D nested scalar quantization with Slepian-Wolf coding is compared to Wyner-Ziv coding of i.i.d. Gaussian sources at high data rates. We show that nested scalar quantization with Slepian-Wolf c...
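For context, the 1.53 dB figure quoted above is the classical high-rate gap of entropy-coded uniform scalar quantization relative to the rate-distortion bound, equivalently about a quarter of a bit per sample:

$$\frac{1}{2}\log_2\frac{2\pi e}{12} \approx 0.254\ \text{bits} \quad\Longleftrightarrow\quad 10\log_{10}\frac{2\pi e}{12} \approx 1.53\ \text{dB}.$$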


"Real" Slepian-Wolf codes

We provide a novel achievability proof of the Slepian-Wolf theorem for i.i.d. sources over finite alphabets. We demonstrate that random codes that are linear over the real field achieve the classical Slepian-Wolf rate region. For finite alphabets we show that decoding is equivalent to solving an integer program. The techniques used may be of independent interest for code design for a wide class...


Approaching the Slepian-Wolf Limit with LDPC Coset Codes

We consider the Slepian-Wolf code design based on LDPC (low-density parity-check) coset codes. We derive the density evolution formula for the Slepian-Wolf coding, equipped with a concentration theorem. An intimate connection between the Slepian-Wolf coding and channel coding is established. Specifically we show that, under density evolution, each Slepian-Wolf source coding problem is equivalen...


Encoding of Functions of Correlated Sources

In this correspondence, we describe the achievable rate region for reliably recovering deterministic functions of correlated sources which have a finite alphabet. The method of proof is almost the same as that used to prove the Slepian-Wolf theorem.


Moderate Deviations Asymptotics for Streaming Compression of Correlated Sources

In this paper, we consider the problem of blockwise streaming compression of a pair of correlated sources, which we term streaming Slepian-Wolf coding. We study the moderate deviations regime in which the rate pairs of a sequence of codes converge, along a straight line, to various points on the boundary of the Slepian-Wolf region at a speed slower than the inverse square root of the blockleng...



Journal:
  • IEEE Trans. Information Theory

Volume 21, Issue 

Pages  -

Publication date: 1975